230 research outputs found

    Extraction of fact tables from a relational database: an effort to establish rules in denormalization

    Relational databases are supported by very well established models. However, some neglected problems can occur with the join operator: semantic mistakes caused by the multiple access path problem, and faults when connection traps arise. In this paper we identify and overcome those problems and establish rules for relational data denormalization. Two denormalization forms are proposed and a case study is presented.
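The paper's two denormalization forms are not spelled out in the abstract, but the basic move they build on — folding dimension attributes into a single fact table by joining along foreign keys — can be sketched as follows. The schema and table names here are invented for illustration, not taken from the paper:

```python
import sqlite3

# Illustrative sketch: denormalize a small normalized schema into a
# single fact table via a foreign-key join (hypothetical tables).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sale (id INTEGER PRIMARY KEY, product_id INTEGER,
                   qty INTEGER, FOREIGN KEY(product_id) REFERENCES product(id));
INSERT INTO product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO sale VALUES (10, 1, 5), (11, 2, 3);
-- fact table: one row per sale, with product attributes folded in
CREATE TABLE fact_sale AS
  SELECT s.id AS sale_id, p.name AS product_name, s.qty
  FROM sale s JOIN product p ON p.id = s.product_id;
""")
rows = con.execute("SELECT * FROM fact_sale ORDER BY sale_id").fetchall()
print(rows)  # [(10, 'widget', 5), (11, 'gadget', 3)]
```

Note that a naive join like this is exactly where the multiple-access-path and connection-trap problems the paper discusses can arise: if two join paths exist between the same tables, they may yield semantically different results.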

    FQL: An Extensible Feature Query Language and Toolkit on Searching Software Characteristics for HPC Applications

    The amount of large-scale scientific computing software is increasing dramatically. In this work, we designed a new query language, named Feature Query Language (FQL), to collect and extract HPC-related software features or metadata through quick static code analysis. We also designed and implemented an FQL-based toolkit to automatically detect and present software features using an extensible query repository. A number of large-scale, high-performance computing (HPC) scientific applications are studied in the paper with the FQL toolkit to demonstrate HPC-related feature extraction and information/metadata collection. Unlike existing static software analysis and refactoring tools, which focus on software debugging, development and code transformation, the FQL toolkit is simpler and significantly more lightweight, and strives to collect diverse software metadata easily and rapidly.
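The abstract does not give FQL's actual syntax, so the following is only a hypothetical sketch of the underlying idea — matching an extensible repository of feature queries against source code with quick, pattern-based static analysis. The feature names and patterns are illustrative assumptions, not FQL:

```python
import re

# Hypothetical feature-query repository in the spirit of the FQL toolkit:
# each entry maps a feature name to a static pattern. These patterns are
# illustrative, not the paper's query language.
FEATURE_QUERIES = {
    "MPI": re.compile(r"\bMPI_\w+\s*\("),          # MPI library calls
    "OpenMP": re.compile(r"#\s*pragma\s+omp\b"),   # OpenMP pragmas
    "CUDA kernel": re.compile(r"__global__\b"),    # CUDA kernel qualifier
}

def detect_features(source: str) -> set:
    """Return the set of feature names whose query matches the source."""
    return {name for name, pat in FEATURE_QUERIES.items() if pat.search(source)}

code = """
#include <mpi.h>
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    #pragma omp parallel for
    for (int i = 0; i < 10; i++) {}
    MPI_Finalize();
}
"""
print(sorted(detect_features(code)))  # ['MPI', 'OpenMP']
```

Because the repository is just a data structure, new feature queries can be added without touching the detection code, which mirrors the extensibility the abstract emphasises.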

    Light-Cone Quantization and Hadron Structure

    In this talk, I review the use of the light-cone Fock expansion as a tractable and consistent description of relativistic many-body systems and bound states in quantum field theory, and as a frame-independent representation of the physics of the QCD parton model. Nonperturbative methods for computing the spectrum and LC wavefunctions are briefly discussed. The light-cone Fock state representation of hadrons also describes quantum fluctuations containing intrinsic gluons, strangeness, and charm, and, in the case of nuclei, "hidden color". Fock state components of hadrons with small transverse size, such as those which dominate hard exclusive reactions, have small color dipole moments and thus diminished hadronic interactions, i.e., "color transparency". The use of light-cone Fock methods to compute loop amplitudes is illustrated by the example of the electron anomalous moment in QED. In other applications, such as the computation of the axial, magnetic, and quadrupole moments of light nuclei, the QCD relativistic Fock state description provides new insights which go well beyond the usual assumptions of traditional hadronic and nuclear physics. (LaTeX, 36 pages, 3 figures.)

    Controlling Activity and Selectivity Using Water in the Au-Catalysed Preferential Oxidation of CO in H2

    Industrial hydrogen production through methane steam reforming exceeds 50 million tons annually and accounts for 2–5% of global energy consumption. The hydrogen product, even after processing by the water–gas shift, still typically contains ∼1% CO, which must be removed for many applications. Methanation (CO + 3H2 → CH4 + H2O) is an effective solution to this problem, but consumes 5–15% of the generated hydrogen. The preferential oxidation (PROX) of CO with O2 in hydrogen represents a more efficient solution. Supported gold nanoparticles, with their high CO-oxidation activity and notoriously low hydrogenation activity, have long been examined as PROX catalysts, but have shown disappointingly low activity and selectivity. Here we show that, under the proper conditions, a commercial Au/Al2O3 catalyst can remove CO to below 10 ppm and still maintain an O2-to-CO2 selectivity of 80–90%. The key to maximizing the catalyst activity and selectivity is to carefully control the feed-flow rate and maintain one to two monolayers of water (a key CO-oxidation co-catalyst) on the catalyst surface.

    Meta-All: a system for managing metabolic pathway information

    BACKGROUND: Many attempts are being made to understand biological subjects at a systems level. A major resource for these approaches are biological databases, which store manifold information about DNA, RNA and protein sequences, including their functional and structural motifs, molecular markers, mRNA expression levels, metabolite concentrations, protein-protein interactions, phenotypic traits and taxonomic relationships. The use of these databases is often hampered by the fact that they are designed for special application areas and thus lack universality. Databases on metabolic pathways, which provide an increasingly important foundation for many analyses of biochemical processes at a systems level, are no exception to the rule. Data stored in central databases such as KEGG, BRENDA or SABIO-RK is often limited to read-only access. If experimentalists want to store their own data, possibly still under investigation, there are two possibilities: they can either develop their own information system for managing the data, which is very time-consuming and costly, or they can try to store their data in existing systems, which is often restricted. Hence, an out-of-the-box information system for managing metabolic pathway data is needed. RESULTS: We have designed META-ALL, an information system that allows the management of metabolic pathways, including reaction kinetics, detailed locations, environmental factors and taxonomic information. Data can be stored together with quality tags and in different parallel versions. META-ALL uses the Oracle DBMS and Oracle Application Express. We provide the META-ALL information system for download and use. In this paper, we describe the database structure and give information about the tools for submitting and accessing the data. As a first application of META-ALL, we show how the information contained in a detailed kinetic model can be stored and accessed.
CONCLUSION: META-ALL is a system for managing information about metabolic pathways. It facilitates the handling of pathway-related data and is designed to help biochemists and molecular biologists in their daily research. It is available on the Web and can be downloaded free of charge and installed locally.
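As a rough illustration of the storage model described above — entries carrying quality tags and existing in parallel versions — here is a minimal sketch in SQLite. META-ALL itself uses the Oracle DBMS and Oracle Application Express, and this schema is invented for illustration, not META-ALL's actual structure:

```python
import sqlite3

# Hypothetical mini-schema: each kinetic entry keeps a version number
# and a quality tag, so parallel versions of the same reaction coexist.
con = sqlite3.connect(":memory:")
con.execute("""
CREATE TABLE reaction_kinetics (
    reaction TEXT,
    version  INTEGER,
    km_mM    REAL,   -- Michaelis constant (illustrative value)
    quality  TEXT,   -- quality tag, e.g. 'estimated' or 'measured'
    PRIMARY KEY (reaction, version)
)""")
con.executemany(
    "INSERT INTO reaction_kinetics VALUES (?, ?, ?, ?)",
    [("hexokinase", 1, 0.15, "estimated"),
     ("hexokinase", 2, 0.10, "measured")],
)
# Both versions remain stored; a query can select by quality tag.
row = con.execute(
    "SELECT version, km_mM FROM reaction_kinetics "
    "WHERE reaction = ? AND quality = 'measured'", ("hexokinase",)
).fetchone()
print(row)  # (2, 0.1)
```

The point of the (reaction, version) composite key is that updating a value never overwrites earlier data, which is what lets data "possibly still under investigation" be stored alongside curated entries.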

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research.
It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
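For readers unfamiliar with the AUROC statistic reported above: it equals the probability that a randomly chosen patient with the outcome was assigned a higher difficulty grade than a randomly chosen patient without it (ties counting half). The toy data below are invented for illustration, not drawn from the CholeS cohort:

```python
# Illustrative only: AUROC of a binary outcome against an ordinal grade.
def auroc(scores, labels):
    """Probability a random positive outranks a random negative (ties = 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

grades  = [1, 1, 2, 2, 3, 4, 4, 5]   # hypothetical operative difficulty grades
convert = [0, 0, 0, 0, 1, 0, 1, 1]   # hypothetical conversion-to-open outcomes
print(auroc(grades, convert))  # 0.9
```

An AUROC of 0.5 would mean the grade carries no information about the outcome, while values such as the study's 0.903 for conversion indicate strong discrimination.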

    A Three-Dimensional B.I.E.M. Program

    The program PECET (Boundary Element Program in Three-Dimensional Elasticity) is presented in this paper. The program, written in FORTRAN V and implemented on a UNIVAC 1100, comprises more than 10,000 statements and 96 routines, and offers many capabilities which are explained in detail below. Its purpose is the analysis of 3-D piecewise heterogeneous elastic domains, using a subregionalization process and 3-D parabolic isoparametric boundary elements. The program uses a special database management scheme, described below, and its modular structure gives the package great flexibility. The method of analysis includes an adaptive integration process, an original treatment of boundary conditions, a complete treatment of body forces, a Modified Conjugate Gradient solution method, and an original storage scheme that substantially reduces memory requirements.
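The abstract does not specify the modification used in the program's Modified Conjugate Gradient Method, so the sketch below shows only the standard conjugate gradient scheme for a symmetric positive-definite system, which the modified variant builds on:

```python
# Standard (unmodified) conjugate gradient for A x = b, with A symmetric
# positive definite. Pure-Python sketch for a small dense system.
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual r = b - A x, with x = 0
    p = r[:]                      # initial search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:          # converged
            break
        # new direction is A-conjugate to the previous ones
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(conjugate_gradient(A, b))  # ≈ [1/11, 7/11] ≈ [0.0909, 0.6364]
```

In exact arithmetic the method terminates in at most n iterations, and because it needs only matrix-vector products and a few vectors of storage, it pairs naturally with the memory-saving storage scheme the abstract mentions.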